Reviews: Integrating Bayesian and Discriminative Sparse Kernel Machines for Multi-class Active Learning
Originality: The combination of sampling in areas of 'greater interest' while adjusting to the underlying distribution appears in many active learning works, but the objective in (1) is novel, and approaching both in a unified framework is challenging. The lower bounding of the optimization problem is also new.
Quality: The experimental results are very thorough and show the improvement of the proposed method over random sampling as well as several other baselines, and the exploration of the effect of tuning parameters and initial sample size is excellent. However, the theoretical contributions appear incomplete. The main theoretical contribution is the (mislabelled) Theorem 2, and both its statement and proof are extremely informal.
The paper proposes a novel algorithm for active learning in the multi-class setting. The authors present a theoretical guarantee regarding the sparseness of the model, as well as an empirical evaluation across 6 datasets comparing against 5 baseline methods. All reviewers tend to vote for acceptance, but they point out several areas of improvement, for which the authors provide feedback. I strongly expect the final version of the paper to include changes that address:
- A formal statement and proof outline for Theorem 2.
- A comparison of the RVM, SVM, and KMC methods in the passive learning setting (mentioned in the author feedback), to help distinguish the benefit of the novel model alone, in addition to the combination of model and active sampling.
Including those results would significantly increase the value of the study.
Integrating Bayesian and Discriminative Sparse Kernel Machines for Multi-class Active Learning
We propose a novel active learning (AL) model that integrates Bayesian and discriminative kernel machines for fast and accurate multi-class data sampling. By joining a sparse Bayesian model and a maximum margin machine under a unified kernel machine committee (KMC), the proposed model is able to identify a small number of data samples that best represent the overall data space while accurately capturing the decision boundaries. The integration is conducted using the maximum entropy discrimination framework, resulting in a joint objective function that contains generalized entropy as a regularizer. Such a property allows the proposed AL model to choose data samples that more effectively handle non-separable classification problems. Parameter learning is achieved through a principled optimization framework that leverages convex duality and the sparse structure of the KMC to efficiently optimize the joint objective function.
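The abstract's central idea, selecting points that are both representative of the overall data space and informative about the decision boundaries, can be illustrated with a minimal sketch. This is not the paper's actual KMC objective; it is a generic committee-style scoring rule, assuming an RBF kernel for representativeness and a classifier-supplied multi-class margin for uncertainty. All function and parameter names (`select_queries`, `alpha`, `gamma`) are hypothetical.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF kernel values between rows of X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def select_queries(X_unlabeled, margins, n_queries=5, alpha=0.5):
    """Score each unlabeled point by combining uncertainty (small
    multi-class margin) with representativeness (kernel density),
    then return the indices of the top-scoring points to label next."""
    # Representativeness: mean kernel similarity to the rest of the pool,
    # so points in dense regions of the data space score higher.
    K = rbf_kernel(X_unlabeled, X_unlabeled)
    density = K.mean(axis=1)
    # Uncertainty: invert the margin so points near a decision
    # boundary (small margin) score higher.
    uncertainty = 1.0 / (1e-8 + margins)
    # alpha trades off the two criteria after normalizing each to [0, 1].
    score = (alpha * uncertainty / uncertainty.max()
             + (1 - alpha) * density / density.max())
    return np.argsort(-score)[:n_queries]
```

In the paper's framework the two criteria are fused in a single joint objective rather than blended by a manual weight; the sketch only shows why combining them favors samples that are simultaneously typical and boundary-adjacent.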